Transfer Learning with Joint Distribution Adaptation and Maximum Margin Criterion
Authors
Abstract
Similar resources
Joint Maximum Margin and Maximum Entropy Learning of Graphical Models
Inferring structured predictions based on correlated covariates remains a central problem in many fields, including NLP, computer vision, and computational biology. Typically, both the input covariates and output predictions can be high-dimensional, multi-modal, noisy, partially observable, and bearing latent structures; each of these characteristics adds a degree of complexity to the task of l...
Deep Transfer Learning with Joint Adaptation Networks
Deep networks rely on massive amounts of labeled data to learn powerful models. For a target task short of labeled data, transfer learning enables model adaptation from a different source domain. This paper addresses deep transfer learning under a more general scenario in which the joint distributions of features and labels may change substantially across domains. Based on the theory of Hilbert spa...
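The joint-distribution matching described above builds on kernel embeddings of distributions in a reproducing kernel Hilbert space. As a minimal illustration of the underlying idea only (a marginal Maximum Mean Discrepancy estimate, not the paper's joint criterion; the function names `rbf_kernel` and `mmd2` are this sketch's own):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix between rows of X and rows of Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def mmd2(Xs, Xt, gamma=1.0):
    # Biased estimate of squared Maximum Mean Discrepancy between
    # source samples Xs and target samples Xt under an RBF kernel:
    # ||mean-embedding(Xs) - mean-embedding(Xt)||^2 in the RKHS.
    Kss = rbf_kernel(Xs, Xs, gamma)
    Ktt = rbf_kernel(Xt, Xt, gamma)
    Kst = rbf_kernel(Xs, Xt, gamma)
    return Kss.mean() + Ktt.mean() - 2 * Kst.mean()
```

Samples drawn from the same distribution give an estimate near zero, while a shifted target distribution yields a clearly larger value; deep adaptation methods minimize such a discrepancy term alongside the task loss.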
Maximum margin learning and adaptation of MLP classifiers
Conventional MLP classifiers used in phonetic recognition and speech recognition may encounter local minima during training, and they often lack an intuitive and flexible adaptation approach. This paper presents a hybrid MLP-SVM classifier and its associated adaptation strategy, where the last layer of a conventional MLP is learned and adapted in the maximum separation margin sense. This struct...
Background Modeling via Incremental Maximum Margin Criterion
Subspace learning methods are widely used in background modeling to tackle illumination changes. Their main advantage is that they do not need labeled data during the training and running phases. Recently, White et al. [1] have shown that a supervised approach can significantly improve robustness in background modeling. Following this idea, we propose to model the background via a supervise...
Laplacian Maximum Margin Criterion for Image Recognition
Previous works have demonstrated that Laplacian embedding preserves the local intrinsic structure well. However, it ignores the diversity of the data and may impair its local topology. In this paper, we build an objective function to learn the local intrinsic structure that characterizes both the local similarity and diversity of the data, and then combine it with the global structure to build a scatter...
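For context on the Maximum Margin Criterion that several of the papers above build on: MMC seeks a projection W maximizing tr(W^T (S_b - S_w) W), i.e. the top eigenvectors of the difference of between-class and within-class scatter. A minimal sketch in plain NumPy (the helper name `mmc_projection` is hypothetical, not from any of the listed papers):

```python
import numpy as np

def mmc_projection(X, y, dim):
    # Maximum Margin Criterion: take the top `dim` eigenvectors of
    # S_b - S_w. Unlike Fisher LDA, no inversion of S_w is needed,
    # so the method also works when S_w is singular.
    classes = np.unique(y)
    mean = X.mean(0)
    d = X.shape[1]
    Sb = np.zeros((d, d))  # between-class scatter
    Sw = np.zeros((d, d))  # within-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(0)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * diff @ diff.T
        Sw += (Xc - mc).T @ (Xc - mc)
    vals, vecs = np.linalg.eigh(Sb - Sw)           # symmetric matrix
    order = np.argsort(vals)[::-1][:dim]           # largest eigenvalues
    return vecs[:, order]                          # (d, dim) projection
```

Projecting two well-separated classes with `X @ W` keeps their class means far apart in the reduced space, which is the separation the criterion maximizes.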
Journal
Journal title: Journal of Physics: Conference Series
Year: 2019
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/1169/1/012028